Slepian–Wolf coding
In information theory and communication, Slepian–Wolf coding, also known as the Slepian–Wolf bound, is a fundamental result in distributed source coding discovered by David Slepian and Jack Wolf in 1973. It concerns the theoretical limits of lossless compression of two correlated sources.
Distributed coding is the coding of two (in this case) or more dependent sources with separate encoders and a joint decoder. Given two statistically dependent i.i.d. finite-alphabet random sequences X and Y, the Slepian–Wolf theorem gives the theoretical bounds for the lossless coding rates for distributed coding of the two sources, as shown below:
: R_X \geq H(X|Y),
: R_Y \geq H(Y|X),
: R_X + R_Y \geq H(X,Y).
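For example (an illustrative calculation, not part of the original article), suppose X is uniform on \{0,1\} and Y agrees with X except for a crossover probability of 0.1. Writing h for the binary entropy function,
: H(X) = H(Y) = 1, \quad H(X|Y) = H(Y|X) = h(0.1) \approx 0.47,
: H(X,Y) = H(Y) + H(X|Y) \approx 1.47,
so any rate pair with R_X \geq 0.47, R_Y \geq 0.47 and R_X + R_Y \geq 1.47 bits per symbol is achievable.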
If the two sources are encoded and decoded independently, the lowest rates achievable for lossless compression are H(X) and H(Y) for X and Y respectively, where H(X) and H(Y) are the entropies of X and Y. However, with joint decoding, if a vanishing error probability for long sequences is accepted, the Slepian–Wolf theorem shows that a much better compression rate can be achieved. As long as the total rate of X and Y is larger than their joint entropy H(X,Y) and neither source is encoded with a rate larger than its own entropy, distributed coding can achieve an arbitrarily small error probability for long sequences.
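The gain from joint decoding can be checked numerically. The following Python sketch (an illustration only; the function names entropies and in_slepian_wolf_region are hypothetical, not from any standard library) computes the relevant entropies from a joint probability mass function and tests whether a rate pair satisfies the three Slepian–Wolf bounds:
<syntaxhighlight lang="python">
import numpy as np

def entropies(p_xy):
    """Return H(X), H(Y), H(X|Y), H(Y|X), H(X,Y) in bits for a
    joint pmf given as a 2-D array p_xy[x, y]."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)            # marginal of X
    p_y = p_xy.sum(axis=0)            # marginal of Y

    def H(p):
        p = p[p > 0]                  # skip zero-probability cells
        return float(-np.sum(p * np.log2(p)))

    h_xy = H(p_xy.ravel())
    return {"H(X)": H(p_x), "H(Y)": H(p_y),
            "H(X|Y)": h_xy - H(p_y), "H(Y|X)": h_xy - H(p_x),
            "H(X,Y)": h_xy}

def in_slepian_wolf_region(r_x, r_y, p_xy, tol=1e-12):
    """Check the three Slepian–Wolf bounds for the rate pair (r_x, r_y)."""
    h = entropies(p_xy)
    return (r_x >= h["H(X|Y)"] - tol and
            r_y >= h["H(Y|X)"] - tol and
            r_x + r_y >= h["H(X,Y)"] - tol)
</syntaxhighlight>
For the binary example above, p_xy = [[0.45, 0.05], [0.05, 0.45]] gives H(X,Y) ≈ 1.47 bits, so the rate pair (R_X, R_Y) = (1.0, 0.5) is admissible even though R_Y is well below H(Y) = 1.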
A special case of distributed coding is compression with decoder side information, where source Y is available at the decoder but not at the encoder. This can be treated as the condition that a rate R_Y = H(Y) has already been used to encode Y, while we intend to use H(X|Y) to encode X. In other words, two isolated sources can compress data as efficiently as if they were communicating with each other. Here the whole system operates asymmetrically (the compression rates of the two sources are asymmetric).
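In practice this asymmetric case is often realized by binning: the encoder transmits only the syndrome of X with respect to a linear code, and the decoder recovers X from the syndrome together with the side information Y. The following toy Python sketch (our own illustration, assuming the correlation flips at most one bit per 7-bit block) uses the (7,4) Hamming code to compress each 7-bit block of X into a 3-bit syndrome:
<syntaxhighlight lang="python">
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column j is the
# binary representation of j + 1 (least significant bit in row 0).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def sw_encode(x):
    """Encoder for X: transmit only the 3-bit syndrome of a 7-bit block."""
    return H @ x % 2

def sw_decode(s, y):
    """Decoder: within the coset of syndrome s, pick the sequence
    closest in Hamming distance to the side information y."""
    syn = (H @ y + s) % 2             # syndrome of the error e = x XOR y
    pos = syn[0] + 2 * syn[1] + 4 * syn[2]
    x_hat = y.copy()
    if pos > 0:                       # single-bit disagreement at pos - 1
        x_hat[pos - 1] ^= 1
    return x_hat

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=7)
y = x.copy()
y[rng.integers(7)] ^= 1               # side information: x with one bit flipped
assert np.array_equal(sw_decode(sw_encode(x), y), x)
</syntaxhighlight>
Each block thus costs 3 bits instead of 7, approaching H(X|Y) for this correlation model; practical Slepian–Wolf schemes replace the Hamming code with stronger channel codes such as LDPC or turbo codes.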
This bound was extended to the case of more than two correlated sources by Thomas M. Cover in 1975, and similar results were obtained in 1976 by Aaron D. Wyner and Jacob Ziv for the lossy coding of jointly Gaussian sources.
==See also==

* Data compression
* Data synchronization
* Synchronization (computer science)
* DISCUS
* Timeline of information theory

Excerpt source: Wikipedia, the free encyclopedia.
Read the full "Slepian–Wolf coding" article at Wikipedia.


